Chinese Academy of Sciences Launches SpikingBrain, a Brain-like Large Model: Achieving 100x Speed Breakthrough with 2% of the Data
Recently, the team of Li Guoqi and Xu Bo at the Institute of Automation, Chinese Academy of Sciences, jointly released SpikingBrain 1.0, billed as the world's first large-scale brain-like spiking large model. Whereas current mainstream large language models, such as the GPT series, are generally built on the Transformer architecture, SpikingBrain 1.0 shows striking speed on long texts: it can process ultra-long inputs of up to 4 million tokens at more than 100 times the speed of mainstream Transformer models, while requiring only about 2% of the training data.
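The article does not describe SpikingBrain's internals, but the "spiking" in its name refers to spiking neural networks, in which neurons communicate through discrete spikes rather than continuous activations. A minimal sketch of a leaky integrate-and-fire (LIF) neuron, the classic spiking unit, may help illustrate the idea; all names and parameter values here are illustrative and not taken from the model:

```python
def lif_neuron(inputs, threshold=1.0, decay=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential leaks over
    time, accumulates incoming current, and emits a spike (1) when it
    crosses the threshold, after which it resets to zero."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = decay * potential + current  # leak, then integrate input
        if potential >= threshold:
            spikes.append(1)   # fire a spike
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # stay silent below threshold
    return spikes

# Constant sub-threshold input: the neuron fires only once enough
# charge has accumulated, producing a sparse spike train.
print(lif_neuron([0.5] * 6))  # → [0, 0, 1, 0, 0, 1]
```

Because such neurons are silent most of the time, spiking models are often described as event-driven and energy-efficient, which is the general motivation behind brain-inspired architectures of this kind.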